
    Influence of statistical methods and reference dates on describing temperature change in Alaska

    Quantifying temperature trends across multiple decades in Alaska is essential for informing policy on climate change in the region. However, Alaska's climate is governed by a complex set of drivers operating at various spatial and temporal scales, which we posit should result in a sensitivity of trend estimates to the selection of reference start and end dates as well as the choice of statistical methods employed for quantifying temperature change. As such, this study addresses three questions: (1) How sensitive are temperature trend estimates in Alaska to reference start dates? (2) To what degree do methods vary with respect to estimating temperature change in Alaska? and (3) How do different reference start dates and statistical methods respond to climatic events that impact Alaska's temperature? To answer these questions, we examine the use of five methods for quantifying temperature trends at 10 weather stations in Alaska and compare multiple reference start dates from 1958 to 1993 while using a single reference end date of 2003. The results from this analysis demonstrate that, with some methods, the discrepancy in temperature trend estimates between consecutive start dates can be larger than the overall temperature change reported for the second half of the 20th century. Second, different methods capture different climatic patterns, thus influencing temperature trend estimates. Third, temperature trend estimation varies more significantly when a reference start date is defined by an extreme temperature. These findings emphasize that sensitivity analyses should be an essential component in estimating multidecadal temperature trends and that comparing estimates derived from different methods should be performed with caution. Furthermore, the ability to describe temperature change using current methods may be compromised given the increase in temperature extremes in contemporary climate change.
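    The start-date sensitivity described in this abstract can be illustrated with a minimal sketch: fit an ordinary least-squares trend to an annual temperature series for a sliding reference start date and compare the slopes. The data below are synthetic (a weak trend plus a hypothetical 1976 step shift and noise), standing in for the station records, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual mean temperature anomalies, 1958-2003, with a step
# shift in 1976 (a hypothetical stand-in for an abrupt climatic event;
# the real Alaska station data are not reproduced here).
years = np.arange(1958, 2004)
temps = 0.01 * (years - 1958) + np.where(years >= 1976, 0.6, 0.0)
temps += rng.normal(0.0, 0.4, size=years.size)

def ols_trend(start_year):
    """Least-squares trend (degrees per decade) from start_year through 2003."""
    mask = years >= start_year
    slope = np.polyfit(years[mask], temps[mask], 1)[0]
    return 10.0 * slope

# Trends estimated from consecutive start dates can differ substantially
# when the start date sits on either side of an abrupt shift.
for start in (1975, 1976, 1977):
    print(start, round(ols_trend(start), 2))
```

    Swapping OLS for the other trend estimators the study compares (e.g. robust or nonparametric fits) would change the numbers but not the qualitative point: the estimate depends jointly on the method and the reference start date.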

    3.0 T cardiovascular magnetic resonance in patients treated with coronary stenting for myocardial infarction: evaluation of short term safety and image quality

    Purpose To evaluate safety and image quality of cardiovascular magnetic resonance (CMR) at 3.0 T in patients with coronary stents after myocardial infarction (MI), in comparison to the clinical standard at 1.5 T. Methods Twenty-five patients (21 men; 55 ± 9 years) with first MI treated with primary stenting underwent 18 scans at 3.0 T and 18 scans at 1.5 T. Twenty-four scans were performed 4 ± 2 days and 12 scans 125 ± 23 days after MI. Cine (steady-state free precession) and late gadolinium-enhanced (LGE, segmented inversion-recovery gradient echo) images were acquired. Patient safety and image artifacts were evaluated, and in 16 patients stent position was assessed during repeat catheterization. Additionally, image quality was scored from 1 (poor quality) to 4 (excellent quality). Results There were no clinical events within 30 days of CMR at 3.0 T or 1.5 T, and no stent migration occurred. At 3.0 T, image quality of cine studies was clinically useful in all scans, but not sufficient for quantitative analysis in 44% of the scans, due to stent (6/18 scans), flow (7/18 scans) and/or dark band artifacts (8/18 scans). Image quality of LGE images at 3.0 T was not sufficient for quantitative analysis in 53%, and not clinically useful in 12%. At 1.5 T, all cine and LGE images were quantitatively analyzable. Conclusion 3.0 T is safe in the acute and chronic phase after MI treated with primary stenting. Although cine imaging at 3.0 T is suitable for clinical use, quantitative analysis and LGE imaging are less reliable than at 1.5 T. Further optimization of pulse sequences at 3.0 T is essential.

    The prevalence of Giardia infection in dogs and cats, a systematic review and meta-analysis of prevalence studies from stool samples

    Giardia has a wide range of host species and is a common cause of diarrhoeal disease in humans and animals. Companion animals are able to transmit a range of zoonotic diseases to their owners, including giardiasis, but the size of this risk is not well known. The aim of this study was to analyse giardiasis prevalence rates in dogs and cats worldwide using a systematic search approach. Meta-analysis made it possible to describe associations between Giardia prevalence and various confounding factors. Pooled prevalence rates were 15.2% (95% CI 13.8-16.7%) for dogs and 12% (95% CI 9.2-15.3%) for cats. However, there was very high heterogeneity between studies. Meta-regression showed that the diagnostic method used had a major impact on reported prevalence, with studies using ELISA, IFA and PCR reporting prevalence rates between 2.6 and 3.7 times greater than studies using microscopy. Conditional negative binomial regression found that symptomatic animals had higher prevalence rate ratios (PRR) than asymptomatic animals: 1.61 (95% CI 1.33-1.94) in dogs and 1.94 (95% CI 1.47-2.56) in cats. Giardia was much more prevalent in young animals. For cats >6 months, PRR=0.47 (0.42-0.53), and in dogs of the same age group PRR=0.36 (0.32-0.41). Additionally, dogs kept as pets were less likely to be positive (PRR=0.56 (0.41-0.77)), but any difference in cats was not significant. Faecal excretion of Giardia is common in dogs and slightly less so in cats. However, the exact rates depend on the diagnostic method used and on the age and origin of the animal. What risk such endemic colonisation poses to human health is still unclear, as it will depend not only on prevalence rates but also on what assemblages are excreted and how people interact with their pets.
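    The pooling step behind prevalence figures like those above can be sketched in a few lines. This is a simplified fixed-effect inverse-variance combination on the logit scale; the review itself uses random-effects models that additionally account for the very high between-study heterogeneity it reports, and the study counts below are hypothetical, not taken from the review.

```python
import numpy as np

def pooled_prevalence(positives, totals):
    """Fixed-effect inverse-variance pooling of study prevalences
    on the logit scale (a simplified sketch of meta-analytic pooling)."""
    p = np.asarray(positives) / np.asarray(totals)
    logit = np.log(p / (1 - p))
    var = 1.0 / (np.asarray(totals) * p * (1 - p))  # approx. variance of each logit
    w = 1.0 / var                                   # inverse-variance weights
    pooled_logit = np.sum(w * logit) / np.sum(w)
    return 1.0 / (1.0 + np.exp(-pooled_logit))      # back-transform to a proportion

# Hypothetical study counts (positives, sample sizes), for illustration only
print(round(pooled_prevalence([30, 12, 45], [200, 150, 300]), 3))
```

    Because the pooled logit is a weighted mean, the back-transformed estimate always lies between the smallest and largest study prevalence; a random-effects variant would widen the weights with a between-study variance term.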

    Predicting Academic Performance: A Systematic Literature Review

    The ability to predict student performance in a course or program creates opportunities to improve educational outcomes. With effective performance prediction approaches, instructors can allocate resources and instruction more accurately. Research in this area seeks to identify features that can be used to make predictions, to identify algorithms that can improve predictions, and to quantify aspects of student performance. Moreover, research in predicting student performance seeks to determine interrelated features and to identify the underlying reasons why certain features work better than others. This working group report presents a systematic literature review of work in the area of predicting student performance. Our analysis shows a clearly increasing amount of research in this area, as well as an increasing variety of techniques used. At the same time, the review uncovered a number of issues with research quality that drive a need for the community to provide more detailed reporting of methods and results and to increase efforts to validate and replicate work.
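    As a minimal illustration of the feature-based prediction the review surveys, the sketch below fits a logistic regression from scratch on hypothetical data: early-assignment scores as features, final pass/fail as the label. Both the data and the model choice are assumptions for illustration; the review covers many algorithms and feature sets.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: three early-assignment scores (0-100) per student,
# and a pass/fail label loosely driven by their mean plus noise.
X = rng.uniform(0, 100, size=(200, 3))
y = (X.mean(axis=1) + rng.normal(0, 10, 200) > 55).astype(float)

# Minimal logistic regression fitted by batch gradient descent
Xb = np.column_stack([np.ones(len(X)), X / 100.0])  # bias column + scaled features
w = np.zeros(Xb.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))               # predicted pass probability
    w -= 0.1 * Xb.T @ (p - y) / len(y)              # gradient step on log-loss

accuracy = np.mean((1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5) == y)
print(round(accuracy, 2))
```

    In practice, the review's point about research quality applies directly here: reported accuracy is only meaningful with held-out evaluation and full reporting of features, data, and validation protocol.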

    Law and justice for a global world: a philosophy of law and justice

    We can at once sum up what has to follow here by saying that law and justice are a kind of culture, a way of life, a mentality, which is also called "civilized" and which is the contrary of the lazy, the dirty, the ignorant, the arbitrary, of which rule by the merely stronger or richer is but one example. Therefore one should not speak of, e.g., "democracy and the rule of law", but at best of the rule of law and only thereafter, perhaps, of "democracy": namely then, and only then, when there are people who are actually capable of achieving the rule of law through it, which does not necessarily happen through advertisements and incessant talking. Worse, the latter has always been a strong indication that just the contrary applies. Such a culture requires that the basic principles of Civilization, Justice and Law are widely understood and therefore organically applied by everybody in daily life. "Prosperity in Justice" is the purpose of any community or society that has a claim to civilization. Justice includes, as we shall see, order, peace and security. Particular forms of government are but means to this end, instruments that are supposed to achieve that purpose. If they do not, they are of no use, whatever glorious names they might carry. It is obvious that such a culture requires well-qualified, well-informed and basically well-intentioned people. Those do not "grow on trees". There is no quick fix. Shortcuts, even when accompanied by hefty professions of goodness, quickly lead to deception. People of such a culture have to be made aware, instructed and confronted with a certain amount of experience in the field. Then they can serve as examples and spread these "values" throughout society. That is a demanding task. Yet it underlies the law if its exercise is not to consist only of empty words and an exaggerated dance around the "golden calf". That may sound idealist, yet on a broader level it turns out to be of rather practical, i.e. very real, quality, as a look into the newspaper quickly shows. And that which we call practical in everyday life shall be dealt with here too.

    Future snowfall in the Alps: projections based on the EURO-CORDEX regional climate models

    Twenty-first century snowfall changes over the European Alps are assessed based on high-resolution regional climate model (RCM) data made available through the EURO-CORDEX initiative. Fourteen different combinations of global and regional climate models with a target resolution of 12 km and two different emission scenarios are considered. As raw snowfall amounts are not provided by all RCMs, a newly developed method to separate snowfall from total precipitation based on near-surface temperature conditions and accounting for subgrid-scale topographic variability is employed. The evaluation of the simulated snowfall amounts against an observation-based reference indicates the ability of RCMs to capture the main characteristics of the snowfall seasonal cycle and its elevation dependency but also reveals considerable positive biases especially at high elevations. These biases can partly be removed by the application of a dedicated RCM bias adjustment that separately considers temperature and precipitation biases. Snowfall projections reveal a robust signal of decreasing snowfall amounts over most parts of the Alps for both emission scenarios. Domain and multi-model mean decreases in mean September–May snowfall by the end of the century amount to −25 and −45 % for representative concentration pathway (RCP) scenarios RCP4.5 and RCP8.5, respectively. Snowfall in low-lying areas in the Alpine forelands could be reduced by more than −80 %. These decreases are driven by the projected warming and are strongly connected to an important decrease in snowfall frequency and snowfall fraction and are also apparent for heavy snowfall events. In contrast, high-elevation regions could experience slight snowfall increases in midwinter for both emission scenarios despite the general decrease in the snowfall fraction. 
    These increases in mean and heavy snowfall can be explained by a general increase in winter precipitation and by the fact that, with increasing temperatures, climatologically cold areas are shifted into a temperature interval which favours higher snowfall intensities. In general, percentage changes in snowfall indices are robust with respect to the RCM postprocessing strategy employed: similar results are obtained for raw, separated, and separated–bias-adjusted snowfall amounts. Absolute changes, however, can differ among these three methods.
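    The core idea of diagnosing snowfall from total precipitation using near-surface temperature can be sketched as below. This is a deliberately minimal version: the paper's method additionally accounts for subgrid-scale topographic variability, and the 2 °C threshold here is an illustrative assumption rather than the paper's calibrated scheme.

```python
import numpy as np

def separate_snowfall(precip_mm, t2m_c, threshold_c=2.0):
    """Diagnose snowfall from total precipitation with a simple
    near-surface temperature threshold: precipitation falling at 2-m
    temperatures below threshold_c is counted as snow, else as rain.
    (Illustrative sketch; the paper's separation is more elaborate.)"""
    precip = np.asarray(precip_mm, dtype=float)
    t2m = np.asarray(t2m_c, dtype=float)
    return np.where(t2m < threshold_c, precip, 0.0)

# Daily total precipitation (mm) and 2-m temperature (degrees C)
precip = [5.0, 0.0, 12.0, 3.0]
t2m = [-3.0, 1.0, 4.5, 0.5]
print(separate_snowfall(precip, t2m))
```

    A smoother variant would partition each precipitation amount between snow and rain across a transition interval around the threshold, which better reflects mixed-phase events at grid scale.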